digital twins and product ontogeny

context

A self-driving vehicle is a complex, composite product that will require diagnosis and analysis across a wide variety of conditions. What does that mean for your business if you are the leading manufacturer of intelligent sensors, transponders, controllers, vehicle network products, and semiconductors that enable the next generation of autonomous and connected driving solutions?

Each vehicle contains hundreds of your products, each with its own lifecycle across a variety of systems, locations, and logistics processes. Each product has its own discrete set of intellectual property. Each is potentially pirated or counterfeited. Each has its own potential impact on safety and its own process for corrective action. Each has its own “neighbors”, for example chips made from the same wafer or shipped in the same delivery batch. How can you ensure more agile, responsive, and precise analyses across your supply chain, both upstream and downstream?

Our client issued this challenge to datagenous and our professional services partner, Semaku. Today they use the datagenous service as a core component of their enterprise integration architecture to operationalize solutions that help define and trace quality issues across complex, dynamic processes and legacy systems, enabling their global supply chain to perform root cause analysis in real time. Previously, this process required weeks of dedicated analysis from quality experts.

approach

datagenous provided our client with a coherent, minimal viable infrastructure (MVI) approach to iteratively develop, deploy, and operate data services by:

  1. creating digital twins of things by defining and attaching universal identifiers (UIDs), e.g. tokenization (see the first sketch after this list)
  2. defining workflows for the lifecycle of things across legacy enterprise resources
  3. opportunistically instrumenting enterprise resources as programmatic, standards-compliant data service endpoints (second sketch below)
  4. opportunistically deploying agents that enable federated access to resource endpoints (third sketch below)
  5. integrating endpoints into a frame of reference that allows for provisioning, orchestration, and governance of data services
  6. opportunistically optimizing data service deployments and configurations
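
To make step 1 concrete, the sketch below shows one way a universal identifier might be minted and attached to a physical thing. All names here (DigitalTwin, mint_twin, the wafer and batch fields) are illustrative assumptions, not the datagenous API; the point is that a deterministic UID plus provenance links is what lets a twin track a thing and its “neighbors”.

```python
import uuid
from dataclasses import dataclass, field

# Hypothetical namespace for twin identifiers; the actual datagenous
# tokenization scheme is not shown here.
UID_NAMESPACE = uuid.uuid5(uuid.NAMESPACE_URL, "https://example.com/twins")

@dataclass
class DigitalTwin:
    """A minimal digital twin: a universal identifier plus the
    provenance needed to find a thing's 'neighbors' (e.g. same wafer)."""
    uid: str
    thing_type: str
    provenance: dict = field(default_factory=dict)

def mint_twin(thing_type: str, serial: str, **provenance) -> DigitalTwin:
    # Derive a deterministic UID from the thing's type and serial number,
    # so re-registering the same physical thing yields the same token.
    uid = str(uuid.uuid5(UID_NAMESPACE, f"{thing_type}:{serial}"))
    return DigitalTwin(uid=uid, thing_type=thing_type, provenance=provenance)

# Two chips cut from the same wafer share a provenance link, which is
# what later enables neighbor-based impact analysis.
chip_a = mint_twin("chip", "SN-0001", wafer="W-42", batch="B-7")
chip_b = mint_twin("chip", "SN-0002", wafer="W-42", batch="B-7")
assert chip_a.uid != chip_b.uid
assert chip_a.provenance["wafer"] == chip_b.provenance["wafer"]
```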
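
Step 3 can be pictured as a thin adapter that exposes a legacy record over a standards-compliant interface, here JSON-LD over HTTP. The route, vocabulary URL, and record shape below are assumptions for illustration; the client's actual service contracts are not shown.

```python
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for a legacy system of record (e.g. an MES or PLM database);
# in practice this would be a thin adapter over the existing resource.
LEGACY_RECORDS = {
    "chip:SN-0001": {"wafer": "W-42", "batch": "B-7", "site": "eindhoven"},
}

@app.route("/twins/<thing_type>/<serial>")
def get_twin(thing_type: str, serial: str):
    record = LEGACY_RECORDS.get(f"{thing_type}:{serial}")
    if record is None:
        abort(404)
    # Expose the legacy record as JSON-LD so federated agents can merge
    # it with other endpoints without bespoke point-to-point integration.
    return jsonify({
        "@context": {"@vocab": "https://example.com/vocab#"},
        "@id": f"https://example.com/twins/{thing_type}/{serial}",
        "@type": "DigitalTwin",
        **record,
    })

if __name__ == "__main__":
    app.run(port=8080)
```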
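
Steps 4 and 5 can then be read as federation over those endpoints. This third sketch is again an assumption-laden illustration: the registry URLs and the merge rule are invented, and a real deployment would arbitrate conflicts between systems of record under governance policies rather than last-writer-wins.

```python
import requests

# Hypothetical endpoint registry; in practice this "frame of reference"
# would be provisioned and governed by the integration platform.
ENDPOINTS = [
    "https://plm.example.com/twins/chip/SN-0001",
    "https://logistics.example.com/twins/chip/SN-0001",
]

def federated_view(urls: list[str]) -> dict:
    """Merge JSON-LD fragments about the same twin from several
    independently owned endpoints into one view, keyed by @id."""
    view: dict = {}
    for url in urls:
        fragment = requests.get(url, timeout=5).json()
        # Last-writer-wins is a simplification; real governance rules
        # would arbitrate conflicts between systems of record.
        view.setdefault(fragment["@id"], {}).update(fragment)
    return view
```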

findings

datagenous helped our client invert the traditional data modeling, API design, management, engineering, and integration process by altering the economics of enterprise data service design and provisioning. By rapidly and incrementally developing a consistent, standards-based approach to enterprise data service development, our client was able to automate and integrate business processes and transactions at low risk and cost. Instead of first trying to consolidate a comprehensive universal schema for digital twins and a corresponding consolidated infrastructure, they were able to seed the digital twins in minimally viable infrastructure by embracing and extending existing capacity, leveraging investments in highly automated shop floors at manufacturing locations in Europe and Asia, and reconciling new data APIs and data flows with relevant PLM and logistics data.

For our client, using datagenous reduced the time to implement solutions from 18 months to 3. It lowered risk by delivering loosely coupled operational analytics that insulated legacy systems from heavy or risky integrations. Our client achieved new operational insights through rapid integration and automation of root cause analysis, product provenance, and anticipatory impact analysis, directly affecting their bottom line. And, as part of this process, our client developed a better understanding of their requirements for infrastructure optimization, guiding their big data investments.